    Simulation and Measurement of Multispectral Space Debris Light Curves

    The accumulation of space debris has become one of the greatest threats facing the space industry to date. With an increasing number of objects deposited in Earth's orbit, such as rocket bodies, defunct satellites and general debris fragments, space missions are exposed to a growing risk of collisions. Moreover, the recent surge in commercial space applications is expected to further contribute to the problem. At the Institute of Technical Physics of the Deutsches Zentrum für Luft- und Raumfahrt (DLR) in Stuttgart, resident space objects are monitored with a number of telescopes using active laser and passive sunlight illumination. Due to their high altitude and relatively small size, the objects generally appear as unresolved points in photometric images. An object's temporal variation in brightness is referred to as a light curve and carries key information about the object's shape, material composition and rotation. Recovering these parameters from light signals is not trivial, and it is anticipated that the additional information provided by multispectral observations will contribute to a more reliable characterization of space debris. This research covers the development of a physically based simulation to model multispectral light reflections from space debris. The software is targeted towards ground-based observations and is expected to form an integral part of future strategies for comprehensive collision avoidance and space debris removal. Both passive light curves and laser ranging measurements are simulated using three-dimensional satellite models. To improve the accuracy of the simulations, spectral lab measurements of common space materials are incorporated into the rendering. Further, the process of gathering reference measurements with DLR's 43 cm telescope at the Uhlandshöhe Forschungsobservatorium is presented. For the comparison between synthetic and empirical light curves, a detailed calibration of the optical system is performed. The validity of the light curve simulator is confirmed on the basis of recordings obtained from radar calibration targets. Finally, simulated data is used to study the benefits of multispectral observations for the characterization and parameter estimation of space debris.
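
    As a rough illustration of what such a light curve encodes (this is not the DLR simulation pipeline described above, which renders full three-dimensional satellite models), the sketch below computes the apparent magnitude of a simple diffuse (Lambertian) sphere over a sweep of solar phase angles; all numerical values are hypothetical.

```python
import numpy as np

# Illustrative sketch only: apparent magnitude of a small Lambertian
# (diffusely reflecting) sphere using the classic diffuse-sphere phase
# function. Parameter values are hypothetical, not taken from the paper.

M_SUN = -26.74  # apparent visual magnitude of the Sun

def sphere_magnitude(radius_m, distance_m, albedo, phase_angle_rad):
    """Apparent magnitude of a diffuse sphere at a given phase angle."""
    phase = np.sin(phase_angle_rad) + (np.pi - phase_angle_rad) * np.cos(phase_angle_rad)
    flux_ratio = (2.0 * albedo * radius_m**2) / (3.0 * np.pi * distance_m**2) * phase
    return M_SUN - 2.5 * np.log10(flux_ratio)

# Hypothetical pass: a 0.5 m debris fragment at 800 km range, albedo 0.2,
# with the phase angle sweeping from 10 to 90 degrees during the observation.
alpha = np.radians(np.linspace(10, 90, 50))
mags = sphere_magnitude(0.25, 800e3, 0.2, alpha)
print(mags.min(), mags.max())  # brightness range over the pass
```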

    Sentencing as craftwork and the binary epistemologies of the discretionary decision process

    This article contends that it is time to take a critical look at a series of binary categories which have dominated the scholarly and reform epistemologies of the sentencing decision process. These binaries are: rules versus discretion; reason versus emotion; offence versus offender; normative principles versus incoherence; aggravating versus mitigating factors; and aggregate/tariff consistency versus individualized sentencing. These binaries underpin both the 'legal-rational' tradition (by which I mean a view of discretion as inherently suspect, a preference for the use of philosophy of punishment justifications, and an explanation of the decision process through factors or variables) and also the more recent rise of the 'new penology'. Both approaches tend to rely on 'top-down' assumptions of change, which pay limited attention to the agency of penal workers. The article seeks to develop a conception of sentencing craftwork as a social and interpretive process. In so doing, it applies and develops a number of Kritzer's observations (in this issue) about craftwork to sentencing. These craftwork observations are: problem solving (applied to the rules versus discretion and reason versus emotion dichotomies); skills and techniques (normative penal principles and the use of cognitive analytical assumptions); consistency (tariff versus individualized sentencing); and clientele (applied to account giving and the reality of decision making versus expression). By conceiving of sentencing as craftwork, the binary epistemologies of the sentencing decision process, which have dominated (and limited) the scholarly and policy sentencing imaginations, are revealed as dynamic, contingent, and synergistic. However, this is not to say that such binaries are no more than empty rhetoric concealing the reality of the decision process. Rather, these binaries serve as crucial legitimating reference points in the vocabulary of sentencing account giving.

    Human well‐being and climate change mitigation

    Climate change mitigation research is fundamentally motivated by the preservation of human lives and the environmental conditions which enable them. However, the field has to date been rather superficial in its appreciation of theoretical claims in well‐being thought, with deep implications for the framing of mitigation priorities, policies, and research. Major strands of well‐being thought are hedonic well‐being (typically referred to as happiness or subjective well‐being) and eudaimonic well‐being, which includes theories of human needs, capabilities, and multidimensional poverty. Aspects of each can be found in political and procedural accounts such as the Sustainable Development Goals. When these concepts are situated within the challenges of addressing climate change, the choice of approach is highly consequential for: (1) understanding inter‐ and intra‐generational equity; (2) defining appropriate mitigation strategies; and (3) conceptualizing the socio‐technical provisioning systems that convert biophysical resources into well‐being outcomes. Eudaimonic approaches emphasize the importance of consumption thresholds, beyond which dimensions of well‐being become satiated. Related strands of well‐being and mitigation research suggest constraining consumption to within minimum and maximum levels, inviting normative discussions on the social benefits, climate impacts, and political challenges associated with a given form of provisioning. The question of how current socio‐technical provisioning systems can be shifted towards low‐carbon, well‐being enhancing forms constitutes a new frontier in mitigation research, involving not just technological change and economic incentives, but wide‐ranging social, institutional, and cultural shifts.

    The CMS Phase-1 pixel detector upgrade

    The CMS detector at the CERN LHC features a silicon pixel detector as its innermost subdetector. The original CMS pixel detector was replaced with an upgraded pixel system (the CMS Phase-1 pixel detector) in the extended year-end technical stop of the LHC in 2016/2017. The upgraded CMS pixel detector is designed to cope with the higher instantaneous luminosities that have been achieved by the LHC after the upgrades to the accelerator during the first long shutdown in 2013–2014. Compared to the original pixel detector, the upgraded detector has better tracking performance and lower mass, with four barrel layers and three endcap disks on each side providing hit coverage up to an absolute value of pseudorapidity of 2.5. This paper describes the design and construction of the CMS Phase-1 pixel detector as well as its performance from commissioning to early operation in collision data-taking.
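
    For context on the quoted acceptance (a standard definition, not reproduced from the paper itself): pseudorapidity η is a function of the polar angle θ measured from the beam axis, so hit coverage up to |η| = 2.5 corresponds to tracks within roughly 9.4° of the beam line.

```latex
% Standard relation between pseudorapidity and the polar angle theta.
\eta = -\ln\tan\frac{\theta}{2}
\qquad\Longleftrightarrow\qquad
\theta = 2\arctan\!\left(e^{-\eta}\right),
\qquad
|\eta| = 2.5 \;\Rightarrow\; \theta = 2\arctan\!\left(e^{-2.5}\right) \approx 9.4^{\circ}
```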

    Wasserstoffsuperoxyd als Lösungsmittel (Hydrogen Peroxide as a Solvent)

    Spectral Light Curve Simulation for Parameter Estimation from Space Debris

    Characterisation of space debris has become a fundamental task in facilitating sustainable space operations. Ground-based surveillance provides the means to extract key attributes of spacecraft. However, signal inversion attempts are generally under-constrained, which is why an increase in measurement channels through multispectral observations is expected to benefit parameter estimation. The current approach to simulating space debris observations at the Institute of Technical Physics of the German Aerospace Centre (DLR) in Stuttgart relies on monochromatic images produced by the POV-Ray render engine to form light curve signals. Rendered scenes are generated from the location of an observer by propagating a target’s orbit and rotation. This paper describes the simulation of spectral light curves through an extension of DLR’s Raxus Prime simulation environment. Light reflections are computed using the Mitsuba2 spectral render engine, while atmospheric attenuation is accounted for by the radiative transfer library libRadTran. The simulator was validated against multispectral measurements carried out at the Uhlandshöhe research observatory in Stuttgart; measured and synthetic data were found to be in agreement, with an RMS error below 1% of the total measured signal count. Further, simulated spectral products were used to determine a target’s surface material composition and rotation state, and to examine aspects of laser ranging to non-cooperative targets.
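
    The abstract quotes the validation metric only as an RMS error below 1% of the total measured signal count; the sketch below shows one plausible reading of that figure of merit, applied to hypothetical light curve values rather than the paper's data.

```python
import numpy as np

def rms_error_percent(measured, simulated):
    """Root-mean-square difference between a measured and a simulated light
    curve, expressed as a percentage of the total measured signal count.
    One plausible reading of the quoted metric, not the authors' definition."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    rmse = np.sqrt(np.mean((measured - simulated) ** 2))
    return 100.0 * rmse / measured.sum()

# Hypothetical example with two short light curves (counts per frame):
measured = np.array([1200.0, 1350.0, 1500.0, 1420.0, 1300.0])
simulated = np.array([1210.0, 1330.0, 1490.0, 1440.0, 1290.0])
print(f"{rms_error_percent(measured, simulated):.3f} %")
```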

    Technology as empowerment: a capability approach to computer ethics

    Standard agent- and action-based approaches in computer ethics tend to have difficulty dealing with complex systems-level issues such as the digital divide and globalisation. This paper argues for a value-based agenda to complement traditional approaches in computer ethics, and that one value-based approach well suited to technological domains can be found in capability theory. Capability approaches have recently become influential in a number of fields with an ethical or policy dimension, but have not so far been applied in computer ethics. The paper introduces two major versions of the theory – those advanced by Amartya Sen and Martha Nussbaum – and argues that they offer potentially valuable conceptual tools for computer ethics. By developing a theory of value based on core human functionings and the capabilities (powers, freedoms) required to realise them, capability theory is shown to have a number of potential benefits that complement standard ethical theory, opening up new approaches to analysis and providing a framework that incorporates a justice as well as an ethics dimension. The underlying functionalism of capability theory is seen to be particularly appropriate to technology ethics, enabling the integration of normative and descriptive analysis of technology in terms of human needs and values. The paper concludes by considering some criticisms of the theory and directions for further development. Keywords: capability theory, computer ethics, empowerment, freedom, justice, value.